# Lightweight NLU
## Rubert Tiny

cointegrated · MIT license · Large Language Model · Transformers · Multilingual · 36.18k downloads · 41 likes

An extremely compact distilled version (45 MB, 12M parameters) of the bert-base-multilingual-cased model for Russian and English, prioritizing speed and size over absolute accuracy.
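A minimal usage sketch with the Hugging Face Transformers library. The Hub checkpoint ID `cointegrated/rubert-tiny` and the hidden size shown in the comment are assumptions for illustration, not stated above: the idea is simply to load the small encoder and mean-pool its last hidden states into fixed-size sentence vectors.

```python
# Minimal sketch: sentence embeddings with rubert-tiny.
# Assumes the Hub ID "cointegrated/rubert-tiny"; requires `pip install transformers torch`.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("cointegrated/rubert-tiny")
model = AutoModel.from_pretrained("cointegrated/rubert-tiny")

texts = ["Привет, мир!", "Hello, world!"]
inputs = tokenizer(texts, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # (batch, seq_len, hidden)

# Mean-pool over real tokens (mask out padding) to get one vector per sentence.
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)                          # e.g. torch.Size([2, 312])
```

Because the model is so small, this kind of embedding or classification step runs comfortably on CPU, which is the trade-off the description above highlights.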
## DeBERTa V3 Small

microsoft · MIT license · Large Language Model · Transformers · English · 189.23k downloads · 51 likes

DeBERTa-v3 is an improved natural language understanding model from Microsoft. It combines ELECTRA-style replaced-token-detection pretraining with gradient-disentangled embedding sharing, delivering strong performance while keeping the parameter count relatively small.
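As a rough sketch of how such an encoder is typically used (the Hub ID `microsoft/deberta-v3-small` and the two-label head are assumptions for illustration), the pretrained model is loaded with a fresh task head and then fine-tuned:

```python
# Minimal sketch: load DeBERTa-v3 Small with a fresh sequence-classification head.
# Assumes the Hub ID "microsoft/deberta-v3-small"; the tokenizer needs `sentencepiece` installed.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-small")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-v3-small", num_labels=2  # head is randomly initialized
)

inputs = tokenizer("DeBERTa-v3 keeps the parameter count small.", return_tensors="pt")
logits = model(**inputs).logits  # untrained head, so these logits are not meaningful yet
print(logits.shape)              # torch.Size([1, 2])
```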